Subscription full text: 3544 articles
Free: 203 articles
Free within China: 14 articles
By subject area:
Electrical engineering: 37 articles
General: 1 article
Chemical industry: 700 articles
Metalworking: 73 articles
Machinery and instruments: 81 articles
Building science: 159 articles
Mining engineering: 1 article
Energy and power: 113 articles
Light industry: 238 articles
Hydraulic engineering: 33 articles
Petroleum and natural gas: 13 articles
Radio and electronics: 365 articles
General industrial technology: 772 articles
Metallurgical industry: 423 articles
Atomic energy technology: 34 articles
Automation technology: 718 articles
By year:
2023: 26 articles
2022: 27 articles
2021: 91 articles
2020: 78 articles
2019: 70 articles
2018: 118 articles
2017: 103 articles
2016: 115 articles
2015: 99 articles
2014: 140 articles
2013: 225 articles
2012: 219 articles
2011: 273 articles
2010: 194 articles
2009: 225 articles
2008: 210 articles
2007: 187 articles
2006: 129 articles
2005: 101 articles
2004: 99 articles
2003: 95 articles
2002: 83 articles
2001: 43 articles
2000: 52 articles
1999: 60 articles
1998: 111 articles
1997: 81 articles
1996: 56 articles
1995: 28 articles
1994: 33 articles
1993: 28 articles
1992: 28 articles
1991: 21 articles
1990: 18 articles
1989: 18 articles
1988: 23 articles
1987: 15 articles
1986: 24 articles
1985: 21 articles
1984: 18 articles
1983: 12 articles
1982: 12 articles
1981: 10 articles
1980: 17 articles
1979: 14 articles
1977: 12 articles
1976: 22 articles
1974: 12 articles
1973: 10 articles
1972: 7 articles
A total of 3761 results found; search took 171 ms.
71.
This paper presents a new algorithm for implementing a reconfigurable distributed shared memory in an asynchronous dynamic network. The algorithm guarantees atomic consistency (linearizability) in all executions despite arbitrary crash failures of processing nodes, message delays, and message loss. It combines a classic quorum-based algorithm for read/write operations with an optimized consensus protocol based on Fast Paxos for reconfiguration, and it achieves two design goals: (i) read and write operations complete rapidly, and (ii) long-term fault tolerance is provided through reconfiguration, a process that evolves the quorum configurations used by reads and writes. The resulting algorithm tolerates dynamism. We formally prove the algorithm correct, analyze its performance in comparison with existing reconfigurable memories, and experimentally evaluate the cost of its reconfiguration mechanism.
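To make the quorum mechanics concrete, here is a minimal Python sketch of the classic majority-quorum read/write pattern that such algorithms build on. It is an illustration under simplifying assumptions (synchronous calls, a fixed replica set, no failures or reconfiguration), not the paper's implementation; the `Replica`, `read`, and `write` names are mine.

```python
# Minimal sketch (hypothetical, not the paper's code) of majority-quorum
# read/write. Any two majorities of the same replica set intersect, which
# is the property that lets reads and writes be linearized.
from dataclasses import dataclass

@dataclass
class Replica:
    tag: int = 0          # logical timestamp
    value: object = None  # stored value

def majority(replicas):
    return replicas[: len(replicas) // 2 + 1]

def read(replicas):
    # Phase 1: query a majority and pick the freshest replica.
    freshest = max(majority(replicas), key=lambda r: r.tag)
    tag, value = freshest.tag, freshest.value
    # Phase 2: write the value back to a majority, so any later read
    # observes at least this tag (needed for atomicity).
    for r in majority(replicas):
        if tag > r.tag:
            r.tag, r.value = tag, value
    return value

def write(replicas, value):
    # Phase 1: learn the highest tag from a majority, then exceed it.
    new_tag = max(r.tag for r in majority(replicas)) + 1
    # Phase 2: store the new (tag, value) pair at a majority.
    for r in majority(replicas):
        if new_tag > r.tag:
            r.tag, r.value = new_tag, value

replicas = [Replica() for _ in range(5)]
write(replicas, "x = 1")
assert read(replicas) == "x = 1"
```

In the paper's setting, reconfiguration replaces the quorum system itself through a Fast-Paxos-based consensus while reads and writes of this shape continue to run.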
72.
Comprehensive Automation for Specialty Crops is a project focused on the needs of the specialty crops sector, centered on apples and nursery trees. The project's main thrusts are the integration of robotics technology and plant science; understanding and overcoming socio-economic barriers to technology adoption; and making the results available to growers and stakeholders through a nationwide outreach program. In this article, we present the results obtained and lessons learned in the first year of the project with a reconfigurable mobility infrastructure for autonomous farm driving. We then present sensor systems developed to enable three real-world agricultural applications (insect monitoring, crop load scouting, and caliper measurement) and discuss how they can be deployed autonomously to yield increased production efficiency and reduced labor costs.
73.
In this paper we study the security of the Advanced Encryption Standard (AES) and AES-like block ciphers against differential cryptanalysis, one of the most powerful methods for analyzing the security of block ciphers. Although no formal proof of the security of AES against differential cryptanalysis has been provided to date, some attempts to compute the maximum expected differential probability (MEDP) for two and four rounds of AES have recently been presented. We improve upon existing approaches to derive better bounds on the expected differential probability (EDP) for two and four rounds of AES, based on a slightly simplified S-box. More precisely, we provide the complete distribution of the EDP for two rounds of this AES variant with five active S-boxes, along with methods to improve the estimates of the EDP in the case of six active S-boxes.
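For orientation, the quantities involved can be written down as follows; these are the standard definitions of differential probability and EDP, not formulas quoted from the paper.

```latex
% Standard definitions (not quoted from the paper).
% Differential probability of an n-bit S-box S for an
% input/output difference pair (a, b):
\[
\mathrm{DP}^{S}(a,b) \;=\; \frac{1}{2^{n}}\,
\#\bigl\{\, x \in \{0,1\}^{n} \;:\; S(x) \oplus S(x \oplus a) = b \,\bigr\}.
\]
% Expected differential probability of an r-round keyed cipher E_k,
% averaged over the key space K; the MEDP is its maximum over
% nonzero input differences:
\[
\mathrm{EDP}_{r}(a,b) \;=\; \frac{1}{|\mathcal{K}|}
\sum_{k \in \mathcal{K}} \mathrm{DP}^{E_{k}}(a,b),
\qquad
\mathrm{MEDP}_{r} \;=\; \max_{a \neq 0,\, b} \mathrm{EDP}_{r}(a,b).
\]
```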
74.
Vu VQ, Yu B, Kass RE. Neural Computation, 2009, 21(3): 688-703.
Information estimates such as the direct method of Strong, Koberle, de Ruyter van Steveninck, and Bialek (1998) sidestep the difficult problem of estimating the joint distribution of response and stimulus by instead estimating the difference between the marginal and conditional entropies of the response. While this is an effective estimation strategy, it tempts the practitioner to ignore the role of the stimulus and the meaning of mutual information. We show here that as the number of trials increases indefinitely, the direct (or plug-in) estimate of the marginal entropy converges (with probability 1) to the entropy of the time-averaged conditional distribution of the response, and the direct estimate of the conditional entropy converges to the time-averaged entropy of the conditional distribution of the response. Under joint stationarity and ergodicity of the response and stimulus, the difference of these quantities converges to the mutual information. When the stimulus is deterministic or nonstationary, the direct estimate of information no longer estimates mutual information, which is no longer meaningful, but it remains a measure of variability of the response distribution across time.
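A small self-contained sketch of the plug-in ("direct") estimator under discussion may help; the data layout and function names are hypothetical, assuming discretized responses with one row per time bin and one column per trial.

```python
# Illustrative plug-in ("direct") information estimate from binned responses.
# Hypothetical layout: responses[t][i] is the discretized response in time
# bin t on trial i; the stimulus is assumed identical across trials.
import math
from collections import Counter

def plugin_entropy(samples):
    """Empirical (plug-in) entropy in bits of a list of discrete samples."""
    counts = Counter(samples)
    n = len(samples)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def direct_information(responses):
    # Marginal entropy: pool all time bins and trials together.
    h_marginal = plugin_entropy([r for t in responses for r in t])
    # Conditional entropy: entropy within each time bin, averaged over time.
    h_conditional = sum(plugin_entropy(t) for t in responses) / len(responses)
    return h_marginal - h_conditional

# Toy example: 4 time bins x 6 trials of binned spike counts.
responses = [[0, 0, 1, 0, 0, 1],
             [2, 2, 2, 1, 2, 2],
             [0, 1, 0, 0, 1, 0],
             [3, 3, 2, 3, 3, 3]]
print(direct_information(responses))
```

The paper's point is about what this difference converges to as trials accumulate: only under joint stationarity and ergodicity does it estimate mutual information.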
75.
The paper presents a paradoxical feature of computational systems that suggests that computationalism cannot explain symbol grounding. If the mind is a digital computer, as computationalism claims, then it must be computing either over meaningful symbols or over meaningless symbols. If it is computing over meaningful symbols, its functioning presupposes the existence of meaningful symbols in the system, i.e., it implies semantic nativism. If the mind is computing over meaningless symbols, no intentional cognitive processes are available prior to symbol grounding; in that case, no symbol grounding could take place, since any grounding presupposes intentional cognitive processes. So, whether computing in the mind is over meaningless or over meaningful symbols, computationalism implies semantic nativism.
76.
Data mining is not an invasion of privacy because access to data is only by machines, not by people: this is the argument investigated here. The current importance of this problem is developed in a case study of data mining in the USA for counterterrorism and other surveillance purposes. After a clarification of the relevant nature of privacy, it is argued that access by machines cannot warrant access to further information, since the analysis will have to be made either by humans or by machines that understand. The paper concludes that current data mining violates the right to privacy and should be subject to the standard legal constraints on access to private information by people.
77.
Fuzzy mining approaches have recently been discussed for deriving fuzzy knowledge. Since items may have their own characteristics, different minimum supports and membership functions may be specified for different items. In previous work, we proposed a genetic-fuzzy data-mining algorithm for extracting minimum supports and membership functions for items from quantitative transactions. In that algorithm, the minimum supports and membership functions of all items were encoded in a single chromosome, which made convergence difficult. In this paper, an enhanced approach is proposed that processes the items with a divide-and-conquer strategy. The approach, called the divide-and-conquer genetic-fuzzy mining algorithm for items with multiple minimum supports (DGFMMS), is designed for finding minimum supports, membership functions, and fuzzy association rules. Candidate solutions are evaluated by their requirement satisfaction divided by the suitability of their derived membership functions. The proposed GA framework maintains multiple populations, one for each item's minimum support and membership functions. The best minimum supports and membership functions from all populations are then gathered and used for mining fuzzy association rules. Experimental results show the effectiveness of the proposed approach.
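A self-contained toy sketch of the divide-and-conquer idea is shown below; the chromosome layout, fitness surrogates, and GA operators are illustrative stand-ins, not the authors' DGFMMS implementation.

```python
# Schematic sketch of the divide-and-conquer strategy (illustrative, not
# the authors' code): one small GA population per item evolves that item's
# minimum support plus the centers of three triangular membership
# functions; fitness = requirement satisfaction / suitability, with both
# terms replaced here by simple surrogates.
import random

def random_chromosome():
    # [min_support, three ordered membership-function centers]
    return [random.uniform(0.01, 0.3)] + sorted(
        random.uniform(0, 10) for _ in range(3))

def fitness(chrom):
    min_sup, centers = chrom[0], chrom[1:]
    # Surrogate "requirement satisfaction": favor moderate supports.
    satisfaction = 1.0 - abs(min_sup - 0.1)
    # Surrogate "suitability": penalize clustered (overlapping) centers.
    spread = min(b - a for a, b in zip(centers, centers[1:]))
    suitability = 1.0 / (1e-6 + max(spread, 0.0))
    return satisfaction / suitability

def mutate(chrom, sigma=0.2):
    child = chrom[:]
    child[random.randrange(len(child))] += random.gauss(0, sigma)
    child[1:] = sorted(child[1:])  # keep centers ordered
    return child

def evolve_item(generations=100, pop_size=20):
    population = [random_chromosome() for _ in range(pop_size)]
    for _ in range(generations):
        population.sort(key=fitness, reverse=True)
        elite = population[: pop_size // 2]
        population = elite + [mutate(random.choice(elite)) for _ in elite]
    return max(population, key=fitness)

# One independent population per item; the best chromosomes are then
# gathered to mine fuzzy association rules with per-item minimum supports.
items = ["bread", "milk", "butter"]
best = {item: evolve_item() for item in items}
```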
78.
Data classification is an important topic in the field of data mining due to its wide applications. A number of related methods have been proposed based on well-known learning models such as decision trees or neural networks. Although data classification has been widely discussed, relatively few studies have explored temporal data classification. Most existing research has focused on improving classification accuracy by using statistical models, neural networks, or distance-based methods, but these cannot interpret the results of classification for users. In many settings, such as gene-expression microarray analysis, users prefer interpretable classification information to a classifier that merely has high accuracy. In this paper, we propose a novel pattern-based data mining method, named classify-by-sequence (CBS), for classifying large temporal datasets. The main methodology behind CBS is the integration of sequential pattern mining with probabilistic induction. CBS has the merit of simplicity in implementation, and its pattern-based architecture can supply clear classification information to users. Experimental evaluation shows that CBS delivers classification results with high accuracy on two real time-series datasets. In addition, we designed a simulator to evaluate the performance of CBS on datasets with different characteristics. The results show that CBS can discover hidden patterns and classify data effectively by utilizing the mined sequential patterns.
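The following simplified Python sketch illustrates the pattern-based idea: mine frequent subsequences per class, then score a new sequence by the class patterns it contains. The mining and scoring details are my simplification, not the authors' CBS algorithm.

```python
# Simplified illustration of the classify-by-sequence idea (not the
# authors' implementation): per-class frequent subsequences serve as
# interpretable evidence, and classification accumulates their supports.
from collections import defaultdict
from itertools import combinations

def subsequences(seq, max_len=3):
    """All ordered (not necessarily contiguous) subsequences up to max_len."""
    for k in range(1, max_len + 1):
        for idx in combinations(range(len(seq)), k):
            yield tuple(seq[i] for i in idx)

def mine_patterns(train, min_support=0.4):
    """Per-class relative frequency of each subsequence above min_support."""
    by_class = defaultdict(list)
    for seq, label in train:
        by_class[label].append(seq)
    patterns = defaultdict(dict)  # class -> {pattern: support}
    for label, seqs in by_class.items():
        counts = defaultdict(int)
        for seq in seqs:
            for pat in set(subsequences(seq)):
                counts[pat] += 1
        for pat, c in counts.items():
            if c / len(seqs) >= min_support:
                patterns[label][pat] = c / len(seqs)
    return patterns

def classify(seq, patterns):
    """Score each class by the supports of its patterns found in seq."""
    present = set(subsequences(seq))
    scores = {label: sum(s for p, s in pats.items() if p in present)
              for label, pats in patterns.items()}
    return max(scores, key=scores.get)

train = [("abcab", "up"), ("abab", "up"), ("ccba", "down"), ("cbca", "down")]
patterns = mine_patterns(train)
print(classify("abca", patterns))
```

Because the evidence for a decision is the set of matched patterns, the result can be explained to users, which is the interpretability advantage the abstract emphasizes.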
79.
Robots acting in human environments usually need to perform multiple motion and force tasks while respecting a set of constraints. When physical contact with the environment is established, the newly activated force task or contact constraint may interfere with other tasks. The objective of this paper is to provide a control framework that achieves real-time control of humanoid robots performing both strictly and non-strictly prioritized motion and force tasks. It is a torque-based quasi-static control framework that handles a dynamically changing task hierarchy with simultaneous priority transitions as well as activation or deactivation of tasks. A quadratic programming problem is solved to maintain the desired task hierarchies, subject to constraints. A generalized projector is used to quantitatively regulate how much a task can influence or be influenced by other tasks through the modulation of a priority matrix. Smooth variation of the priority matrix avoids sudden hierarchy rearrangements and thus reduces the risk of instability. The effectiveness of this approach is demonstrated on both a simulated and a real humanoid robot.
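As a rough illustration of resolving several tasks at once, here is a minimal numerical sketch that softens the hierarchy into per-task weights and solves a least-squares program. The paper's framework is richer (torque-level control, inequality constraints, and a generalized projector with a smoothly varying priority matrix); the scalar weights and all names below are illustrative assumptions.

```python
# Minimal numerical sketch of soft-prioritized multi-task resolution:
# minimize sum_i w_i * ||J_i dq - v_i||^2 over joint velocities dq.
# Smoothly varying the weights w_i mimics a continuous priority
# transition (a scalar stand-in for the paper's priority matrix).
import numpy as np

def solve_tasks(jacobians, targets, weights):
    """Stack weighted task rows and solve the least-squares program."""
    A = np.vstack([np.sqrt(w) * J for J, w in zip(jacobians, weights)])
    b = np.concatenate([np.sqrt(w) * v for v, w in zip(targets, weights)])
    dq, *_ = np.linalg.lstsq(A, b, rcond=None)
    return dq

# Two toy tasks on a 3-DoF system: a 1-D position task and a posture task.
J1, v1 = np.array([[1.0, 0.5, 0.0]]), np.array([0.2])
J2, v2 = np.eye(3), np.zeros(3)

# Ramping task 2's weight emulates a smooth priority transition; no sudden
# hierarchy rearrangement occurs at any intermediate alpha.
for alpha in (0.0, 0.5, 1.0):
    w = [1.0, 0.01 + alpha]
    print(alpha, solve_tasks([J1, J2], [v1, v2], w))
```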
80.
The objective of this paper is to elucidate an organizational process for the design of generic technologies (GTs). While recognizing the success of GTs, the literature on innovation management generally describes their design in terms of evolutionary strategies featuring multiple, uncertain trials that result in the discovery of common features among multiple applications. This random walk depends on multiple market and technological uncertainties that are considered exogenous: as smart as he may be, the 'gambler' must play in a given probability space. But what happens when the innovator is not a gambler but a designer, i.e., when the actor is able to establish new links between previously independent emerging markets and technologies? Formally speaking, the actor designs a new probability space. Building on a case study of two technological development programmes at the French Center for Atomic Energy, we present cases of GTs that correspond to this logic of designing the probability space, i.e., the logic of intentionally designing common features that bridge the gap between a priori heterogeneous applications and technologies. This study provides another example showing that the usual trial-and-learning strategy is not the only way to design GTs; these technologies can also be designed by intentionally building new interdependences between markets and technologies. Our main result is that building these interdependences requires organizational patterns corresponding to a 'design of exploration' phase, in which multiple technology suppliers and application providers are involved in designing both the probability space itself and the instruments to explore and benefit from this new space.